Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties
Authors
Abstract
The application of the Maximum Entropy (ME) principle leads to a minimum of the Mutual Information (MI), I(X,Y), between random variables X and Y that is compatible with prescribed joint expectations and given ME marginal distributions. A sequence of sets of joint constraints leads to a hierarchy of lower MI bounds that approach the true MI increasingly closely. In particular, using standard Gaussian marginal distributions, this allows the MI to be decomposed into two positive terms: the Gaussian MI (Ig), which depends on the Gaussian correlation, i.e., the correlation between 'Gaussianized' variables, and a non-Gaussian MI (Ing), which coincides with the joint negentropy and depends on nonlinear correlations. Joint moments of a prescribed total order p are bounded within a compact set defined by Schwarz-like inequalities, where Ing grows from zero on the 'Gaussian manifold', where the moments are those of Gaussian distributions, towards infinity at the set's boundary, where a deterministic relationship between the variables holds. Sources of joint non-Gaussianity are systematized by estimating Ing between the input and output of a nonlinear synthetic channel contaminated by multiplicative and non-Gaussian additive noises, over a full range of signal-to-noise ratio (snr) variances. We study the effect of varying snr on Ig and Ing under several signal/noise scenarios.
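To make the decomposition I = Ig + Ing concrete: once both marginals are mapped to standard Gaussians, Ig reduces to the bivariate Gaussian formula Ig = -(1/2) ln(1 - ρg²), with ρg the correlation of the Gaussianized variables, and Ing follows as the remainder I - Ig given any consistent estimate of the full MI. The Python sketch below is a minimal illustration of that recipe, assuming a rank-based empirical Gaussianization; the function names and the estimator choice are ours, not the paper's code.

```python
import numpy as np
from scipy.stats import norm, rankdata

def gaussianize(x):
    """Map a sample to a standard Gaussian marginal via the empirical
    probability integral transform (ranks -> Phi^{-1})."""
    n = len(x)
    u = rankdata(x) / (n + 1.0)   # empirical CDF values in (0, 1)
    return norm.ppf(u)            # 'Gaussianized' variable

def gaussian_mi(x, y):
    """Gaussian MI  Ig = -0.5 * ln(1 - rho_g^2), where rho_g is the
    correlation between the Gaussianized variables (assumption: this
    rank-based recipe stands in for the paper's Gaussian correlation)."""
    xg, yg = gaussianize(x), gaussianize(y)
    rho_g = np.corrcoef(xg, yg)[0, 1]
    return -0.5 * np.log1p(-rho_g**2)

# Toy check: on a bivariate Gaussian sample, Ig should recover the
# analytic MI -0.5*ln(1 - rho^2), so Ing = I - Ig is ~0 there.
rng = np.random.default_rng(0)
rho = 0.8
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=50_000).T
print(gaussian_mi(x, y), -0.5 * np.log(1 - rho**2))
```

On this toy Gaussian sample the two printed values agree closely, and the residual Ing = I - Ig is near zero, consistent with the 'Gaussian manifold' limit described above; for non-Gaussian data, Ing picks up whatever nonlinear dependence the Gaussian correlation misses.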
Similar resources
Minimum Mutual Information and Non-Gaussianity through the Maximum Entropy Method: Estimation from Finite Samples
The Minimum Mutual Information (MinMI) Principle provides the least committed, maximum-joint-entropy (ME) inferential law that is compatible with prescribed marginal distributions and empirical cross constraints. Here, we estimate the MI bounds (the MinMI values) generated by constraining sets Tcr comprising mcr linear and/or nonlinear joint expectations, computed from samples of N iid outcome...
Limitations of state estimation: absolute lower bound of minimum variance estimation/filtering, Gaussianity-whiteness measure (joint Shannon-Wiener entropy), and Gaussianing-whitening filter (maximum Gaussianity-whiteness measure principle)
This paper aims at obtaining performance limitations of state estimation in terms of variance minimization (minimum variance estimation and filtering) using information theory. Two new notions, negentropy rate and Gaussianity-whiteness measure (joint Shannon-Wiener entropy), are proposed to facilitate the analysis. Topics such as Gaussianing-whitening filter (the maximum Gaussianity-whiteness m...
Dependence, Correlation and Gaussianity in Independent Component Analysis
Independent component analysis (ICA) is the decomposition of a random vector into linear components which are "as independent as possible." Here, "independence" should be understood in its strong statistical sense: it goes beyond (second-order) decorrelation and thus involves the non-Gaussianity of the data. The ideal measure of independence is the "mutual information" and is known to be related t...
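The point that independence goes beyond decorrelation admits a one-line counterexample: for X ~ N(0,1) and Y = X², Cov(X,Y) = E[X³] = 0, yet Y is a deterministic function of X, so the MI is far from zero. The short Python sketch below (our illustration, not from the cited paper; the histogram plug-in MI estimator is a crude but standard choice) shows a near-zero correlation alongside a clearly positive MI estimate.

```python
import numpy as np

# Uncorrelated yet strongly dependent: for X ~ N(0,1) and Y = X^2,
# Cov(X, Y) = E[X^3] = 0, but Y is a deterministic function of X.
rng = np.random.default_rng(1)
x = rng.standard_normal(100_000)
y = x**2

print(np.corrcoef(x, y)[0, 1])   # ~0: second-order statistics are blind

# A crude plug-in MI estimate from a 2-D histogram is clearly > 0.
pxy, _, _ = np.histogram2d(x, y, bins=40)
pxy = pxy / pxy.sum()
px, py = pxy.sum(axis=1), pxy.sum(axis=0)
nz = pxy > 0
mi = np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz]))
print(mi)                        # nats; far from 0 despite zero correlation
```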
Computing the uncertainty interval based on Shannon entropy and the Dempster-Shafer theory of evidence
Abstract: Dempster-Shafer theory is the most important method for treating uncertainty in information systems. The theory was introduced by Dempster using the concept of upper and lower probabilities and was later extended by Shafer. Entropy, a basic concept of information theory, can also be used as a measure of a system's uncertainty in a specific situation. In th...
Minimax Mutual Information Approach for Independent Component Analysis
Minimum output mutual information is regarded as a natural criterion for independent component analysis (ICA) and is used as the performance measure in many ICA algorithms. Two common approaches in information-theoretic ICA algorithms are minimum mutual information and maximum output entropy approaches. In the former approach, we substitute some form of probability density function (pdf) estima...
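For a whitened two-dimensional mixture, the minimum-output-MI criterion described above reduces to a one-parameter search over the unmixing rotation angle. The Python sketch below (our illustration under that assumption; the histogram MI plug-in and all names are ours, not the cited paper's algorithm) recovers a 30° mixing rotation of two independent uniform sources by minimizing an estimate of the output MI.

```python
import numpy as np

def hist_mi(x, y, bins=30):
    """Crude plug-in MI (nats) from a 2-D histogram; illustration only."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

# Two independent non-Gaussian (uniform) sources, mixed by a rotation.
rng = np.random.default_rng(2)
s = rng.uniform(-1.0, 1.0, size=(2, 20_000))
mix = np.deg2rad(30.0)
A = np.array([[np.cos(mix), -np.sin(mix)], [np.sin(mix), np.cos(mix)]])
x = A @ s

# Minimum-output-MI ICA on this whitened 2-D problem: scan the
# unmixing angle and keep the rotation with the least dependent outputs.
angles = np.deg2rad(np.arange(0.0, 90.0, 1.0))
mis = []
for a in angles:
    W = np.array([[np.cos(a), np.sin(a)], [-np.sin(a), np.cos(a)]])
    out = W @ x
    mis.append(hist_mi(out[0], out[1]))
best = angles[int(np.argmin(mis))]
print(np.rad2deg(best))   # ~30 degrees: the mixing rotation is undone
```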
Journal: Entropy
Volume: 14, Issue: -
Pages: -
Publication year: 2012